Some asymptotics for local least-squares regression with regularization

Authors

  • Jürgen Franke
  • Joachim Weickert
Abstract

We derive some asymptotics for a new approach to curve estimation proposed by Mrázek et al. [3] which combines localization and regularization. This methodology has been considered as the basis of a unified framework covering various smoothing methods in the analogous two-dimensional problem of image denoising. As a first step toward understanding this approach theoretically, we restrict our discussion here to the least-squares case, where we have explicit formulas for the function estimates and where we can derive a rather complete asymptotic theory from known results for the Priestley-Chao curve estimate.
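The Priestley-Chao estimate mentioned above is a kernel-weighted local average: for an equispaced design t_i = i/n, it is m̂(x) = (1/(nh)) Σ_i K((x − t_i)/h) y_i. The following is a minimal sketch of that estimator (not the Mrázek et al. method itself), assuming a Gaussian kernel; the function name and data are illustrative only.

```python
import numpy as np

def priestley_chao(x_grid, t, y, h):
    """Priestley-Chao kernel regression estimate on an equispaced design.

    m_hat(x) = (1/(n*h)) * sum_i K((x - t_i)/h) * y_i,
    here with a Gaussian kernel K.  A sketch, not the paper's code.
    """
    n = len(t)
    u = (x_grid[:, None] - t[None, :]) / h        # (m, n) scaled distances
    K = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)  # Gaussian kernel weights
    return K @ y / (n * h)

# Equispaced design t_i = i/n with a noisy sine curve (illustrative data).
rng = np.random.default_rng(0)
n = 200
t = np.arange(1, n + 1) / n
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(n)
m_hat = priestley_chao(np.array([0.25, 0.5, 0.75]), t, y, h=0.05)
```

The asymptotics referred to in the abstract concern exactly this kind of estimator as n → ∞ and the bandwidth h → 0 with nh → ∞.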


Similar articles

Local regularization assisted orthogonal least squares regression

A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability for the orthogonal least squares model selection to produce a very sparse model with good generalization performance is g...


Approximate Nearest Neighbor Regression in Very High Dimensions

Fast and approximate nearest-neighbor search methods have recently become popular for scaling nonparametric regression to more complex and high-dimensional applications. As an alternative to fast nearest-neighbor search, training data can also be incorporated online into appropriate sufficient statistics and adaptive data structures, such that approximate nearest-neighbor predictions can be acc...


Asymptotics of Gaussian Regularized Least Squares

We consider regularized least-squares (RLS) with a Gaussian kernel. We prove that if we let the Gaussian bandwidth σ → ∞ while letting the regularization parameter λ → 0, the RLS solution tends to a polynomial whose order is controlled by the relative rates of decay of 1/σ² and λ: if λ = σ^(−(2k+1)), then, as σ → ∞, the RLS solution tends to the kth-order polynomial with minimal empirical error. We...
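For reference, Gaussian-kernel RLS (kernel ridge regression) solves (K + λI)α = y for the kernel matrix K_ij = exp(−(x_i − x_j)²/(2σ²)) and predicts f(x) = Σ_i α_i k(x, x_i). Below is a minimal one-dimensional sketch of this standard formulation (not the paper's code); function names and data are illustrative.

```python
import numpy as np

def gaussian_rls(x_train, y_train, x_test, sigma, lam):
    """Regularized least squares with a Gaussian kernel.

    Solves (K + lam * I) alpha = y, then predicts via
    f(x) = sum_i alpha_i * k(x, x_i).  A sketch only.
    """
    def kernel(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-d**2 / (2.0 * sigma**2))

    K = kernel(x_train, x_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return kernel(x_test, x_train) @ alpha

# Illustrative fit of a quadratic with a small regularization parameter.
x_train = np.linspace(-1.0, 1.0, 20)
y_train = x_train**2
pred = gaussian_rls(x_train, y_train, np.array([0.3]), sigma=0.5, lam=1e-6)
```

The regime studied in the paper corresponds to growing σ and shrinking λ in this same formula, under which the fitted function flattens toward a low-order polynomial.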


Local Asymptotics for Polynomial Spline Regression

In this paper we develop a general theory of local asymptotics for least squares estimates over polynomial spline spaces in a regression problem. The polynomial spline spaces we consider include univariate splines, tensor product splines, and bivariate or multivariate splines on triangulations. We establish asymptotic normality of the estimate and study the magnitude of the bias due to spline a...


Nonlinear Cointegrating Regression under Weak Identification

An asymptotic theory is developed for a weakly identified cointegrating regression model in which the regressor is a nonlinear transformation of an integrated process. Weak identification arises from the presence of a loading coefficient for the nonlinear function that may be close to zero. In that case, standard nonlinear cointegrating limit theory does not provide good approximations to the f...




Publication date: 2007